and (x₃, y₃) as shown in Table 4.1. If a regression model has been fitted for them, there will be three model outputs as well. Note that x and y are experimental data, but ŷ is not. For each pair, there is a model output ŷ. ŷ takes three values, one for each of the three values of x, i.e., α + βx₁, α + βx₂ or α + βx₃. The regression model outputs are individually denoted by ŷ₁, ŷ₂ and ŷ₃ for the three values of x, and ŷ is a vector of the regressed means. The final regression function is an interpolated curve based on these regressed means. Between the observations and the regressed means, there are three regression errors. They are denoted by ε₁, ε₂ and ε₃. Their squared sum is called the sum of squared error or the total regression error and is denoted by ε² = ε₁² + ε₂² + ε₃² in this case.
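As a minimal sketch of these definitions (the numbers below are hypothetical and not taken from the text), the following Python snippet fits the line α + βx to three assumed pairs and computes the regressed means, the errors and their squared sum:

    import numpy as np

    x = np.array([1.0, 2.0, 3.0])      # assumed values of the independent variable
    y = np.array([2.1, 3.9, 6.2])      # assumed observations

    beta, alpha = np.polyfit(x, y, 1)  # least-squares estimates of the slope and intercept
    y_hat = alpha + beta * x           # regressed means: alpha + beta*x_1, ..., alpha + beta*x_3
    eps = y - y_hat                    # regression errors eps_i = y_i - y_hat_i
    sse = np.sum(eps**2)               # total regression error eps^2 = eps_1^2 + eps_2^2 + eps_3^2

    print(y_hat, eps, sse)

The regressed means ŷ₁, ŷ₂ and ŷ₃ are exactly the entries of y_hat, and sse corresponds to ε² above.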

Table 4.1 The correspondence between an independent variable x, a dependent variable y, the model outputs (the regressed means) and the regression errors. The squared errors are also listed.

Observation    Model output    Error             Squared error
y₁             ŷ₁              ε₁ = y₁ − ŷ₁      ε₁²
y₂             ŷ₂              ε₂ = y₂ − ŷ₂      ε₂²
y₃             ŷ₃              ε₃ = y₃ − ŷ₃      ε₃²
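The rows of the table follow directly from the definition εᵢ = yᵢ − ŷᵢ. A self-contained sketch (the observations and regressed means below are assumed, not those of Table 4.1) that reproduces the same correspondence:

    y     = [2.1, 3.9, 6.2]            # observations y_1, y_2, y_3 (assumed)
    y_hat = [2.05, 4.10, 6.15]         # regressed means (assumed)

    total = 0.0
    for i, (yi, yhi) in enumerate(zip(y, y_hat), start=1):
        eps = yi - yhi                 # error: observation minus model output
        total += eps**2                # accumulate the squared errors
        print(f"y_{i}={yi:.2f}  y_hat_{i}={yhi:.2f}  eps_{i}={eps:+.2f}  eps_{i}^2={eps**2:.4f}")

    print(f"total squared error = {total:.4f}")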

Fig. 4.5 The illustration of the regression errors. The open dots apart from the solid line stand for the observations (y). The filled dots on the solid line stand for the regressed means (predictions or model outputs, ŷ). The solid line stands for the regression function, i.e., the interpolated function based on the regressed means. The vertical dashed lines stand for the regression errors (distances) between the observations and the regressed means.

Figure 4.5 shows an example where there are seven data points, seven model outputs (predictions or regressed means) and seven errors between the observations and the regressed means.
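A plot of this kind is straightforward to produce. The sketch below uses seven synthetic points (not the data behind Figure 4.5) and standard matplotlib calls to draw open dots for the observations, filled dots for the regressed means on the fitted line, and dashed verticals for the errors:

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    x = np.linspace(1, 7, 7)                        # seven data points, as in Figure 4.5
    y = 1.0 + 0.8 * x + rng.normal(0, 0.5, size=7)  # synthetic observations

    beta, alpha = np.polyfit(x, y, 1)               # least-squares slope and intercept
    y_hat = alpha + beta * x                        # regressed means (model outputs)

    plt.plot(x, y_hat, '-', label='regression function')         # solid line
    plt.plot(x, y, 'o', mfc='none', label='observations (y)')    # open dots
    plt.plot(x, y_hat, 'o', label='regressed means (y-hat)')     # filled dots
    plt.vlines(x, y_hat, y, linestyles='dashed', label='errors') # dashed error lines
    plt.legend()
    plt.show()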